The battle between artists and machines is still raging at full throttle. What began as general dissatisfaction with generative AI has turned into a full-scale counterattack by artists. But why are artists resorting to such measures? What drives them to "blind" artificial intelligence? Here are all the answers…
Why are artists trying to poison artificial intelligence?
The root of this animosity lies in AI models being trained on original images scraped from platforms without the artists' consent. In response, some newly developed tools aim to confuse AI models into making mistakes.
So artists are "poisoning" artificial intelligence through the artwork they release. For instance, Nightshade, a free tool developed by researchers at the University of Chicago, has been downloaded more than 250,000 times since its release in January.
A VentureBeat article on the matter states, "The tool essentially works by pitting AI against AI." In other words, one AI is used to blind another. Platforms like DALL-E and Midjourney gather training data from across the web, including commercial art.
According to reports, Midjourney has trained on over 100 million images without permission from the original creators. It's important to clarify that this doesn't necessarily mean "theft" in a literal sense: the models don't store copies of the images but learn statistical patterns from them, so the output is not an exact replica, only something similar to existing artwork.
How does the data-poisoning tool Nightshade work?
Nightshade works by "subtly altering the image at the pixel level to make other AI programs see something different from reality," causing models trained on it to attach the wrong labels to the objects in the image. The training data is, in effect, poisoned!
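To make the idea of a pixel-level poison concrete, here is a minimal, hypothetical Python sketch of the general technique: a targeted adversarial perturbation against a stand-in classifier. Nightshade's actual method is more sophisticated and targets text-to-image training, so treat this only as a toy; the model, labels, and numbers below are all invented for illustration.

```python
# Toy sketch of pixel-level data poisoning (NOT Nightshade's real method).
# Idea: nudge every pixel slightly in the direction that makes a surrogate
# model prefer a wrong target label, while the change stays nearly
# invisible to human eyes.

import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical, untrained surrogate image classifier with 10 made-up classes.
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(3 * 32 * 32, 10),
)
loss_fn = nn.CrossEntropyLoss()

image = torch.rand(1, 3, 32, 32)   # the artwork, as a 32x32 RGB tensor
target_label = torch.tensor([7])   # e.g. "handbag" instead of "cow"

# Compute the gradient of the loss toward the *target* label
# with respect to the input pixels.
image.requires_grad_(True)
loss = loss_fn(model(image), target_label)
loss.backward()

# FGSM-style step: a tiny signed move per pixel toward the target class.
epsilon = 2 / 255  # small enough to be almost imperceptible
poisoned = (image - epsilon * image.grad.sign()).clamp(0, 1).detach()

print("max pixel change:", (poisoned - image.detach()).abs().max().item())
print("surrogate now predicts class:", model(poisoned).argmax(dim=1).item())
```

The point of the sketch is only that a change far too small for a human to notice can still steer what a model "sees" in the image.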
The result is that models trained on poisoned images start producing output that doesn't match users' prompts. "For example, while human eyes may see a picture of a cow in a green field, an AI model may see a large leather bag lying on the grass," explains the source. The team behind Nightshade also released another AI-blocking tool, Glaze, about a year ago.
That tool is designed to conceal an artist's personal style, which generative AI can otherwise mimic, a nightmare scenario for artists. According to the developers, the long-term solution would be for AI companies to license images from artists rather than scraping them freely from the web.
What are your thoughts on this issue? You can share your opinions in the comments section below.